Dimensionality reduction for supervised learning
Abstract
Outline: Motivation (supervised learning, high dimensionality); Dimensionality reduction (principal component analysis, random projections); Experimental setup (algorithms and datasets, procedure); Results; Discussion; References.

Motivation: supervised learning. We obtain a set T of n examples of a relation {(x, y)}, T = {(x_1, y_1), ..., (x_n, y_n)}, where x ∈ R^d is the input and y is the label (positive or negative). Call T the training set. We want to infer a classifier f̃ such that f̃(x) correctly predicts the label of all x, not just those in T. Example: n = 200 and d = 2.
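The outline pairs two projection methods, PCA and random projections, with a downstream classifier. A minimal sketch of that pipeline with scikit-learn is shown below; the synthetic dataset, the target dimension k = 10, and the k-nearest-neighbour classifier are illustrative assumptions, not the setup used in the slides.

```python
# Minimal sketch: project the data with PCA or a Gaussian random projection,
# then fit a classifier on the reduced representation.
# Dataset, reduced dimension and classifier are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.random_projection import GaussianRandomProjection
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

# n labelled examples (x, y) with x in R^d
X, y = make_classification(n_samples=200, n_features=50, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

k = 10  # reduced dimension
for name, proj in [("PCA", PCA(n_components=k)),
                   ("Random projection",
                    GaussianRandomProjection(n_components=k, random_state=0))]:
    Z_train = proj.fit_transform(X_train)  # fit the projection on training data only
    Z_test = proj.transform(X_test)
    clf = KNeighborsClassifier().fit(Z_train, y_train)
    print(name, "accuracy:", clf.score(Z_test, y_test))
```

Fitting the projection on the training split only keeps the comparison honest: the test points are mapped with the same projection the classifier was trained on.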
Similar references
A supervised probabilistic principal component analysis mixture model in a lossless dimensionality-reduction framework for face recognition
In this paper, we first propose a supervised version of the probabilistic principal component analysis mixture model. We then consider learning a predictive model with projection penalties as an approach to dimensionality reduction without loss of information for face recognition. In the proposed method, a local linear underlying manifold of the data samples is first obtained using the supervised...
Generalization Bounds for Supervised Dimensionality Reduction
We introduce and study the learning scenario of supervised dimensionality reduction, which couples dimensionality reduction and a subsequent supervised learning step. We present new generalization bounds for this scenario based on a careful analysis of the empirical Rademacher complexity of the relevant hypothesis set. In particular, we show an upper bound on the Rademacher complexity that is i...
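For orientation, bounds of this type usually instantiate the standard Rademacher-complexity template, shown here in its generic textbook form rather than as the cited paper's specific result: for a hypothesis set H with losses in [0, 1] and a sample of size n, with probability at least 1 − δ,

```latex
% Generic Rademacher-complexity generalization bound (textbook form,
% not the cited paper's specific result).
R(h) \;\le\; \widehat{R}_n(h) \;+\; 2\,\widehat{\mathfrak{R}}_n(H)
  \;+\; 3\sqrt{\frac{\ln(2/\delta)}{2n}}
\qquad \text{for all } h \in H,
```

where R̂_n(h) is the empirical risk and ℜ̂_n(H) the empirical Rademacher complexity of H. The cited work analyzes this complexity term for the hypothesis set that couples the dimensionality-reduction map with the subsequent supervised learner.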
Coupled dimensionality reduction and classification for supervised and semi-supervised multilabel learning
Coupled training of dimensionality reduction and classification has previously been proposed to improve prediction performance for single-label problems. Following this line of research, in this paper we first introduce a novel Bayesian method that combines linear dimensionality reduction with linear binary classification for supervised multilabel learning and present a deterministic variational...
Semi-supervised classification based on random subspace dimensionality reduction
Graph structure is vital to graph-based semi-supervised learning. However, the problem of constructing a graph that reflects the underlying data distribution has seldom been investigated in semi-supervised learning, especially for high-dimensional data. In this paper, we focus on graph construction for semi-supervised learning and propose a novel method called Semi-Supervised Classification base...
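As a rough illustration of the general idea of combining a random feature subspace with graph-based semi-supervised learning, a generic sketch follows; it is not the method proposed in the cited paper, and the use of scikit-learn's LabelSpreading, the subspace size, and the labelled fraction are assumptions.

```python
# Generic sketch: draw a random feature subspace, then run a graph-based
# semi-supervised learner on the reduced data. This illustrates the idea of
# random-subspace dimensionality reduction for graph-based semi-supervised
# learning; it is NOT the specific method proposed in the cited paper.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.semi_supervised import LabelSpreading

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=100, random_state=0)

# Mark most labels as unknown (-1) to simulate the semi-supervised setting.
y_partial = y.copy()
unlabeled = rng.random(len(y)) < 0.9
y_partial[unlabeled] = -1

# Random subspace: keep a random subset of the original features.
subspace = rng.choice(X.shape[1], size=20, replace=False)
X_sub = X[:, subspace]

# Graph-based label propagation on the reduced data (k-NN graph).
model = LabelSpreading(kernel="knn", n_neighbors=7).fit(X_sub, y_partial)
accuracy = (model.transduction_[unlabeled] == y[unlabeled]).mean()
print("Transductive accuracy on unlabeled points:", accuracy)
```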
Rough Set-based Dimensionality Reduction for Supervised and Unsupervised Learning
The curse of dimensionality is a damning factor for numerous potentially powerful machine learning techniques. Widely approved and otherwise elegant methodologies used for a number of different tasks, ranging from classification to function approximation, exhibit relatively high computational complexity with respect to dimensionality. This severely limits the applicability of such techniques to r...